Large Margin Low Rank Tensor Analysis
Authors
Abstract
We present a supervised model for tensor dimensionality reduction, called large margin low rank tensor analysis (LMLRTA). In contrast to traditional vector-representation-based dimensionality reduction methods, LMLRTA can take tensors of any order as input. Unlike previous tensor dimensionality reduction methods, which can learn only low-dimensional embeddings with an a priori specified dimensionality, LMLRTA can automatically and jointly learn the dimensionality and the low-dimensional representations from data. Moreover, LMLRTA delivers low rank projection matrices, while encouraging data of the same class to be close and data of different classes to be separated by a large distance margin in the low-dimensional tensor space. LMLRTA can be optimized using an iterative fixed-point continuation algorithm, which is guaranteed to converge to a local optimum of the optimization problem. We evaluate LMLRTA on an object recognition application, where the data are represented as 2D tensors, and on a face recognition application, where the data are represented as 3D tensors. Experimental results show the superiority of LMLRTA over state-of-the-art approaches.
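The abstract combines two ingredients: a large-margin distance objective and a low rank (nuclear norm) penalty on the projection matrices, optimized by fixed-point continuation. The sketch below illustrates one such update for a single projection matrix acting on vectorized data; the function names, the triplet-hinge form of the margin loss, and all parameter values are our assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def nuclear_prox(M, tau):
    """Singular value thresholding: the proximal operator of the nuclear
    norm. Shrinking small singular values to zero drives the projection
    matrix toward low rank, so the reduced dimensionality is selected
    automatically rather than fixed in advance."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def lmlrta_style_step(W, X, triplets, margin=1.0, tau=0.1, lr=0.01):
    """One proximal-gradient (fixed-point-continuation-style) update.

    W        : (d_out, d_in) projection matrix.
    X        : (n, d_in) data, one vectorized example per row.
    triplets : (i, j, k) with x_i, x_j in the same class, x_k in another;
               the hinge asks ||W(x_i-x_j)||^2 + margin <= ||W(x_i-x_k)||^2.
    """
    grad = np.zeros_like(W)
    for i, j, k in triplets:
        d_ij, d_ik = X[i] - X[j], X[i] - X[k]
        if margin + (W @ d_ij) @ (W @ d_ij) - (W @ d_ik) @ (W @ d_ik) > 0:
            # Active hinge: pull the same-class pair together,
            # push the different-class pair apart.
            grad += 2.0 * W @ (np.outer(d_ij, d_ij) - np.outer(d_ik, d_ik))
    return nuclear_prox(W - lr * grad, lr * tau)
```

Iterating this update interleaves a gradient step on the margin loss with singular value shrinkage; as singular values are zeroed out, the effective output dimensionality falls out of the optimization, mirroring the joint learning the abstract describes.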
Similar resources
Higher rank Support Tensor Machines for visual recognition
This work addresses the two-class classification problem within the tensor-based large margin classification paradigm. To this end, we formulate the higher rank Support Tensor Machines (STMs), in which the parameters defining the separating hyperplane form a tensor (tensorplane) that is constrained to be the sum of rank-one tensors. Subsequently, we propose two extensions in which the separating...
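As a concrete illustration of the "sum of rank-one tensors" constraint, the sketch below evaluates a rank-R tensorplane on matrix inputs without ever materializing the full parameter tensor. The shapes and names are hypothetical, and learning the factors (e.g., by alternating SVM solves) is omitted.

```python
import numpy as np

# Hypothetical rank-R tensorplane W = sum_r u_r v_r^T for (m x n) inputs.
rng = np.random.default_rng(0)
R, m, n = 3, 8, 10
U = rng.standard_normal((m, R))  # column r holds the factor u_r
V = rng.standard_normal((n, R))  # column r holds the factor v_r
b = 0.0

def decision(X):
    """<W, X> + b, computed factor-by-factor as sum_r u_r^T X v_r,
    so the full m*n parameter tensor is never formed."""
    return sum(U[:, r] @ X @ V[:, r] for r in range(R)) + b

print(np.sign(decision(rng.standard_normal((m, n)))))  # predicted class, +/-1
```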
Reweighted Low-Rank Tensor Decomposition based on t-SVD and its Applications in Video Denoising
The t-SVD based Tensor Robust Principal Component Analysis (TRPCA) decomposes a low-rank multilinear signal corrupted by gross errors into low multi-rank and sparse components by simultaneously minimizing the tensor nuclear norm and the l1 norm. But if the multi-rank of the signal is considerably large and/or a large amount of noise is present, the performance of TRPCA deteriorates. To overcome this problem...
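The computational primitive behind t-SVD-based TRPCA is singular value thresholding applied slice-wise in the Fourier domain, the proximal operator of the tensor nuclear norm. Below is a minimal sketch of that operator only; the reweighting this paper proposes is not shown, and the function name is ours.

```python
import numpy as np

def t_svt(T, tau):
    """Tensor singular value thresholding under the t-SVD: FFT along the
    third mode, soft-threshold the singular values of every frontal slice
    in the Fourier domain, then invert the FFT."""
    F = np.fft.fft(T, axis=2)
    out = np.empty_like(F)
    for k in range(F.shape[2]):
        U, s, Vt = np.linalg.svd(F[:, :, k], full_matrices=False)
        out[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vt
    return np.real(np.fft.ifft(out, axis=2))
```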
Online Robust Low-Rank Tensor Learning
The rapid increase of multidimensional data (a.k.a. tensors) like videos brings new challenges for low-rank data modeling approaches, such as dynamic data size, complex high-order relations, and multiplicity of low-rank structures. Resolving these challenges requires a new tensor analysis method that can perform tensor data analysis online, which, however, is still absent. In this paper, we propose ...
Fundamental Tensor Operations for Large-Scale Data Analysis in Tensor Train Formats
We discuss extended definitions of linear and multilinear operations such as Kronecker, Hadamard, and contracted products, and establish links between them for tensor calculus. Then we introduce effective low-rank tensor approximation techniques including Candecomp/Parafac (CP), Tucker, and tensor train (TT) decompositions with a number of mathematical and graphical representations. We also pro...
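Of the three decompositions mentioned, the tensor train (TT) format is the least standard; the sketch below implements the classical TT-SVD construction (sequential reshape-and-truncate via the SVD), one common way to obtain the low-rank TT cores this snippet refers to. Ranks and shapes are placeholders.

```python
import numpy as np

def tt_svd(T, ranks):
    """Decompose T into tensor-train cores G_k of shape (r_{k-1}, n_k, r_k)
    by sequentially reshaping the remainder and truncating with the SVD."""
    dims, cores, r_prev = T.shape, [], 1
    M = T.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        r = min(ranks[k], s.size)                      # truncate to rank r
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        M = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))       # last core
    return cores

cores = tt_svd(np.random.rand(4, 5, 6), ranks=[3, 3])
print([c.shape for c in cores])  # [(1, 4, 3), (3, 5, 3), (3, 6, 1)]
```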
Regularized Computation of Approximate Pseudoinverse of Large Matrices Using Low-Rank Tensor Train Decompositions
We propose a new method for low-rank approximation of Moore-Penrose pseudoinverses (MPPs) of large-scale matrices using tensor networks. The computed pseudoinverses can be useful for solving or preconditioning large-scale overdetermined or underdetermined systems of linear equations. The computation is performed efficiently and stably based on the modified alternating least squares (MALS) scheme...
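The tensor-network machinery in this paper is substantial; as a dense-matrix point of reference only, a regularized pseudoinverse of the kind being approximated is the Tikhonov form sketched below. The TT/MALS representation that makes this tractable for very large matrices is not reproduced here.

```python
import numpy as np

def reg_pinv(A, lam=1e-6):
    """Tikhonov-regularized pseudoinverse (A^T A + lam*I)^{-1} A^T, the
    dense analogue of the object approximated in tensor-train format.
    lam trades approximation accuracy for numerical stability."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T)

A = np.random.rand(50, 8)             # overdetermined system
x = reg_pinv(A) @ (A @ np.ones(8))    # least-squares recovery of ones
print(np.allclose(x, np.ones(8), atol=1e-3))
```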
Journal: Neural Computation
Volume: 26, Issue: 4
Pages: -
Publication date: 2014